
    Dynamics in a cluster under the influence of intense femtosecond hard x-ray pulses

    In this paper we examine the behavior of a small cluster of atoms in a short (10-50 fs), very intense hard x-ray (10 keV) pulse. We use numerical modeling based on the non-relativistic classical equations of motion; quantum processes are taken into account through their respective cross sections. We show that there is a Coulomb explosion whose dynamics differ from those found in conventional laser-driven cluster explosions. We discuss the consequences of our results for single-molecule imaging with free-electron laser pulses.
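
    The numerical approach described here, integrating non-relativistic classical equations of motion for an ionized cluster, can be illustrated with a toy Coulomb-explosion integrator. The sketch below is only an assumption-laden illustration in reduced units (Coulomb constant and masses set to 1) using velocity Verlet; the quantum processes (photoionization and Auger cross sections) that the paper couples to the classical dynamics are omitted, and the charge states are placeholders.

    import numpy as np

    def coulomb_forces(pos, q):
        """Pairwise repulsive Coulomb forces on each particle (reduced units)."""
        diff = pos[:, None, :] - pos[None, :, :]      # r_i - r_j for all pairs
        dist = np.linalg.norm(diff, axis=-1)
        np.fill_diagonal(dist, np.inf)                # exclude self-interaction
        mag = q[:, None] * q[None, :] / dist**3
        return (mag[..., None] * diff).sum(axis=1)

    def integrate(pos, vel, q, dt=1e-3, steps=20000):
        """Velocity-Verlet integration of the classical equations of motion."""
        acc = coulomb_forces(pos, q)
        for _ in range(steps):
            pos = pos + vel * dt + 0.5 * acc * dt**2
            new_acc = coulomb_forces(pos, q)
            vel = vel + 0.5 * (acc + new_acc) * dt
            acc = new_acc
        return pos, vel

    # Toy run: 30 singly-charged ions in a compact blob, starting at rest.
    rng = np.random.default_rng(1)
    pos0 = rng.normal(scale=1.0, size=(30, 3))
    charges = np.ones(30)                             # assumed uniform charge state
    pos, vel = integrate(pos0, np.zeros_like(pos0), charges)
    print("mean radial distance after explosion:", np.linalg.norm(pos, axis=1).mean())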

    Measuring and simulating magnetic characteristics using an Epstein frame

    The paper discusses the standard Epstein frame used to measure the magnetic characteristics of a core made of M250-35A material, excited at frequencies between 1 and 400 Hz. The measurement program was built in LabVIEW and includes control, filtering, and data-logging sections. COMSOL Multiphysics 4.3b was chosen as the simulation environment, in which the Jiles-Atherton hysteresis model has been implemented.
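
    Since the abstract names the Jiles-Atherton hysteresis model, a minimal scalar sketch of that model may help. The parameter values below are placeholders, not the fitted M250-35A parameters; the implicit dependence of the anhysteretic slope on dM/dH is neglected and a common sign guard is used, so this is a rough illustration rather than the paper's COMSOL implementation.

    import math
    import numpy as np

    def langevin(x):
        """Langevin function L(x) = coth(x) - 1/x, safe near x = 0."""
        return x / 3.0 if abs(x) < 1e-6 else 1.0 / math.tanh(x) - 1.0 / x

    def dlangevin(x):
        """Derivative L'(x) = 1 - coth(x)**2 + 1/x**2, safe near x = 0."""
        return 1.0 / 3.0 if abs(x) < 1e-6 else 1.0 - 1.0 / math.tanh(x) ** 2 + 1.0 / x ** 2

    def jiles_atherton(H, Ms=1.6e6, a=1100.0, alpha=1e-4, k=400.0, c=0.2):
        """Quasi-static scalar J-A model: magnetisation M(t) for a field waveform H(t)."""
        M = np.zeros_like(H)
        for i in range(1, len(H)):
            dH = H[i] - H[i - 1]
            delta = 1.0 if dH >= 0 else -1.0
            He = H[i - 1] + alpha * M[i - 1]              # effective field
            Man = Ms * langevin(He / a)                   # anhysteretic magnetisation
            Mirr = (M[i - 1] - c * Man) / (1.0 - c)       # irreversible component
            if (Man - Mirr) * delta < 0:                  # guard against negative susceptibility
                dMirr = 0.0
            else:
                dMirr = (Man - Mirr) / (k * delta - alpha * (Man - Mirr))
            dMan = (Ms / a) * dlangevin(He / a)
            M[i] = M[i - 1] + ((1.0 - c) * dMirr + c * dMan) * dH
        return M

    # One 50 Hz excitation period as a usage example (H in A/m, B in tesla).
    t = np.linspace(0.0, 0.02, 2000)
    H = 1500.0 * np.sin(2 * math.pi * 50.0 * t)
    M = jiles_atherton(H)
    B = 4e-7 * math.pi * (H + M)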

    Strategic directions of rural and agricultural development in the light of sustainable development

    The goals set out in the National Sustainable Development Strategy for Hungarian agriculture are in line with the New Hungary Rural Development Strategic Plan, on which the agricultural grants of the European Union's 2007-2013 financial period are based, and, more specifically, with the main goals of the New Hungary Rural Development Program developed on the basis of that Strategic Plan. The New Hungary Rural Development Program is intended as a framework, in harmony with the principles of long-term sustainability, for the development of agriculture, the preservation of rural natural treasures, the improvement of rural economies, and the strengthening of cohesion in rural communities. Keywords: rural development, strategic goals, sustainability, policy measures, Agricultural and Food Policy, Community/Rural/Urban Development

    Querying Semantic Web Resources Using TRIPLE Views

    Resources on the Semantic Web are described by metadata based on some formal or informal ontology. Casual users are often not familiar with the details of a domain ontology, which makes it difficult for them (or their tools) to formulate queries that find the relevant resources. Since users consider resources in their own specific context, the most straightforward solution is to let them formulate queries in an ontology that corresponds to a user-specific view. We present an approach based on multiple views expressed in ontologies simpler than the domain ontology. This allows users to query heterogeneous data repositories in terms of multiple, relatively simple view ontologies. Ontology developers can define such view ontologies and the corresponding mapping rules. These ontologies are represented in Semantic Web ontology languages such as RDFS, DAML+OIL, or OWL. We illustrate the approach with examples from the e-learning domain using the Semantic Web query and transformation language TRIPLE.
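
    The TRIPLE syntax itself is not reproduced here. As a rough analogue of the view-based querying the abstract describes, the sketch below uses rdflib with a SPARQL CONSTRUCT query as the mapping rule: it materialises a simpler view ontology over a richer domain ontology, and the user then queries only the view vocabulary. All namespaces, classes, and properties are made up for illustration.

    from rdflib import Graph, Namespace, Literal, RDF

    DOMAIN = Namespace("http://example.org/domain#")
    VIEW = Namespace("http://example.org/view#")

    # A tiny domain-ontology graph (illustrative data only).
    domain_graph = Graph()
    domain_graph.add((DOMAIN.res1, RDF.type, DOMAIN.LearningResource))
    domain_graph.add((DOMAIN.res1, DOMAIN.coversConcept, DOMAIN.SemanticWeb))
    domain_graph.add((DOMAIN.res1, DOMAIN.hasTitle, Literal("Intro to RDF")))

    # Mapping rule from domain terms to the simpler view vocabulary.
    mapping_rule = """
    PREFIX d: <http://example.org/domain#>
    PREFIX v: <http://example.org/view#>
    CONSTRUCT { ?r a v:Material ; v:topic ?c ; v:title ?t . }
    WHERE     { ?r a d:LearningResource ; d:coversConcept ?c ; d:hasTitle ?t . }
    """
    view_graph = domain_graph.query(mapping_rule).graph

    # The user only needs the view ontology to find relevant resources.
    user_query = """
    PREFIX v: <http://example.org/view#>
    SELECT ?title
    WHERE { ?r a v:Material ;
               v:topic <http://example.org/domain#SemanticWeb> ;
               v:title ?title . }
    """
    for row in view_graph.query(user_query):
        print(row.title)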

    Cost and Quality Assurance in Crowdsourcing Workflows (Extended Abstract)

    Despite recent advances in artificial intelligence and machine learning, many tasks still require human contributions. With the growing availability of the Internet, it is now possible to hire workers on crowdsourcing marketplaces. Many crowdsourcing platforms have emerged in the last decade: Amazon Mechanical Turk, Figure Eight, Wirk, etc. A platform allows employers to post tasks, which are then realized by workers hired from the crowd in exchange for some incentive [3, 19]. Common tasks include image annotation, surveys, classification, recommendation, sentiment analysis, etc. [7]. Existing platforms support simple, repetitive and independent micro-tasks that require a few minutes to an hour to complete. However, many real-world problems are not simple micro-tasks, but rather complex orchestrations of dependent tasks that process input data and collect human expertise. Existing platforms provide interfaces to post micro-tasks to a crowd, but cannot handle complex tasks. The next stage of crowdsourcing is to build systems that specify and execute complex tasks over existing crowd platforms. A natural solution is to use workflows, i.e., orchestrations of phases that exchange data to achieve a final objective. Figure 1 is an example of a complex workflow depicting the image annotation process on SPIPOLL [5], a platform to survey populations of pollinating insects. Contributors take pictures of insects that are then classified by crowdworkers. Pictures are grouped into a dataset that forms the input of the workflow. This dataset is first filtered to eliminate bad pictures (fuzzy, blurred, ...). The remaining pictures are sent to workers who try to classify them; if classification is too difficult, the image is sent to an expert. Initial classification and expert classification are represented by successive phases in the workflow. Pictures that were discarded, classified easily, or studied by experts are then assembled into a result dataset in a final phase, used to compute statistics on insect populations. Workflows alone are not sufficient to crowdsource complex tasks. Many data-centric applications come with budget and quality constraints: as human workers are prone to errors, one has to hire several workers to aggregate a final answer with sufficient confidence. An unlimited budget would allow hiring large pools of workers to assemble reliable answers for each micro-task, but in general a client for a complex task has a limited budget. This forces micro-tasks to be replicated in an optimal way, achieving the best possible quality without exhausting the given budget. The objective is hence to obtain a reliable result, forged through a complex orchestration, at a reasonable cost. Several works consider data-centric models, deployment on crowdsourcing platforms, and aggregation techniques to improve data quality (see [11] for a more complete bibliography). First, coordination of tasks has been considered in languages such as BPM
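
    The budget/quality tension discussed in this abstract can be illustrated with a toy replication-and-aggregation sketch: a fixed budget is spread evenly over micro-tasks, each task is answered by several simulated workers, and answers are combined by majority vote. Worker accuracy, the cost per answer, and the even budget split are assumptions for illustration, not the allocation strategy studied in the paper.

    import random
    from collections import Counter

    def simulate_worker(truth, accuracy=0.7, labels=("insect", "not_insect")):
        """A simulated worker answers correctly with the given probability."""
        if random.random() < accuracy:
            return truth
        return random.choice([l for l in labels if l != truth])

    def crowdsource(tasks, budget, cost_per_answer=1):
        """Spread the budget evenly over tasks and aggregate answers by majority vote."""
        replication = max(1, budget // (cost_per_answer * len(tasks)))
        results = {}
        for task_id, truth in tasks.items():
            answers = [simulate_worker(truth) for _ in range(replication)]
            results[task_id] = Counter(answers).most_common(1)[0][0]
        return replication, results

    # Toy run: 100 identical micro-tasks, a budget of 500 answers in total.
    random.seed(0)
    tasks = {f"img{i}": "insect" for i in range(100)}   # ground truth for the toy run
    replication, results = crowdsource(tasks, budget=500)
    accuracy = sum(results[t] == tasks[t] for t in tasks) / len(tasks)
    print(f"{replication} answers per task, aggregated accuracy = {accuracy:.2f}")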

    Clock-G: A temporal graph management system with space-efficient storage technique

    IoT applications can be naturally modeled as a graph whose edges represent the interactions between devices, sensors, and their environment. Thing'in is a platform, initiated by Orange, that manages a graph of millions of connected and non-connected objects using a commercial graph database. The graph of Thing'in is dynamic because IoT devices create temporary connections between each other. Analyzing the history of these connections paves the way to promising new applications such as object tracking, anomaly detection, and forecasting the future behavior of devices. However, existing commercial graph databases are not designed with native temporal support, which limits their usability in such use cases. In this paper, we discuss the design of a temporal graph management system, Clock-G, and introduce a new space-efficient storage technique, δ-Copy+Log. Clock-G is designed by the developers of the Thing'in platform and is currently being deployed into production. It differs from existing temporal graph management systems by adopting the δ-Copy+Log technique, which targets the mitigation of the apparent trade-off between the conflicting goals of reducing space usage and accelerating query execution. Our experimental results demonstrate that δ-Copy+Log offers overall better performance than traditional storage methods in terms of space usage and query evaluation time.
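
    The abstract does not detail the δ-Copy+Log layout, so the sketch below only illustrates the general copy+log idea it refines: edge-level deltas are appended to a log, a full snapshot is materialised every fixed number of deltas, and a time-travel query replays only the deltas recorded after the closest preceding snapshot. Class and method names are invented for the example; Clock-G's actual storage and δ encoding are more elaborate.

    import copy
    from bisect import bisect_right

    SNAPSHOT_EVERY = 100

    class TemporalGraph:
        def __init__(self):
            self.current = set()          # current edge set {(u, v)}
            self.log = []                 # [(t, op, edge)] with op in {"+", "-"}
            self.snap_times = []          # timestamps of materialised snapshots
            self.snapshots = []           # full copies of the edge set

        def _record(self, t, op, edge):
            self.log.append((t, op, edge))
            if len(self.log) % SNAPSHOT_EVERY == 0:
                self.snap_times.append(t)
                self.snapshots.append(copy.deepcopy(self.current))

        def add_edge(self, t, u, v):
            self.current.add((u, v))
            self._record(t, "+", (u, v))

        def remove_edge(self, t, u, v):
            self.current.discard((u, v))
            self._record(t, "-", (u, v))

        def edges_at(self, t):
            """Reconstruct the edge set as of time t (snapshot + delta replay)."""
            i = bisect_right(self.snap_times, t) - 1
            state = copy.deepcopy(self.snapshots[i]) if i >= 0 else set()
            start_time = self.snap_times[i] if i >= 0 else float("-inf")
            for time, op, edge in self.log:
                if start_time < time <= t:
                    if op == "+":
                        state.add(edge)
                    else:
                        state.discard(edge)
            return state

    # Usage: temporary connections between IoT objects, queried back in time.
    g = TemporalGraph()
    g.add_edge(1, "sensorA", "gateway")
    g.add_edge(5, "sensorB", "gateway")
    g.remove_edge(9, "sensorA", "gateway")
    print(g.edges_at(6))   # both edges existed at time 6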

    RTGEN: A Relative Temporal Graph GENerator

    Graph management systems are emerging as an efficient solution to store and query graph-oriented data. To assess the performance of such systems and compare them, practitioners often design benchmarks that use large-scale graphs. However, such graphs either do not meet the scale requirements or are not publicly available. This has motivated a number of graph generators that produce synthetic graphs whose characteristics mimic those of real-world graphs (degree distribution, community structure, diameter, etc.). Many applications, however, need to deal with temporal graphs, whose topology is in constant change. Although generating static graphs has been extensively studied in the literature, generating temporal graphs has received much less attention. In this work, we propose RTGEN, a relative temporal graph generator that produces temporal graphs by controlling the evolution of the degree distribution. In particular, we generate new graphs with a desired degree distribution out of existing ones, while minimizing the effort needed to transform the source graph into the target one. Our relative graph generation method relies on optimal transport. We extend the method to also handle the community structure of the generated graphs, which is prevalent in a number of applications. Our generation model extends the concepts of the Chung-Lu model with temporal and community-aware support. We validate the generation procedure through experiments showing that the generated graphs conform to the ground-truth parameters.
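
    As a point of reference for the model this generator extends, here is a sketch of plain static Chung-Lu sampling, where each pair (i, j) is connected with probability min(1, w_i * w_j / sum(w)) so that expected degrees match the target weights. The temporal evolution via optimal transport and the community-aware extension described in the abstract are not reproduced, and the parameter choices are illustrative.

    import numpy as np

    def chung_lu(weights, rng=None):
        """Sample an undirected graph whose expected degree sequence is `weights`."""
        rng = rng or np.random.default_rng()
        w = np.asarray(weights, dtype=float)
        total = w.sum()
        prob = np.minimum(1.0, np.outer(w, w) / total)    # pairwise edge probabilities
        upper = np.triu(rng.random(prob.shape) < prob, k=1)
        return np.argwhere(upper)                         # edge list as (i, j) pairs

    # Usage: a heavy-tailed target degree sequence for 1000 nodes.
    rng = np.random.default_rng(42)
    target_degrees = rng.zipf(2.5, size=1000).clip(max=100)
    edges = chung_lu(target_degrees, rng)
    realised = np.bincount(edges.ravel(), minlength=1000)
    print("target mean degree:", target_degrees.mean(), "realised:", realised.mean())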